Efficient k-Support-Norm Regularized Minimization via Fully Corrective Frank-Wolfe Method
Authors
Abstract
The k-support-norm regularized minimization has recently been applied with success to sparse prediction problems. The proximal gradient method is conventionally used to minimize this composite model, but its expensive per-iteration cost can make solving the model time consuming. In this work, we recast the k-support-norm regularized formulation as a constrained formulation and propose a fully corrective Frank-Wolfe type algorithm to minimize the constrained model. We analyze the convergence behavior of the proposed algorithm. Experimental results demonstrate the utility of the k-support norm and the superior efficiency of the proposed algorithm. Examination Committee: Professors Dimitris Metaxas, Kostas Bekris, Konstantinos Michmizos and Vinod Ganapathy
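As a rough illustration of the fully corrective idea (a sketch, not the authors' algorithm, which targets the k-support-norm ball), the following uses an ℓ1-ball constraint, whose linear minimization oracle selects a single signed coordinate; the function names `fcfw` and `lmo_l1` are illustrative only. The key difference from vanilla Frank-Wolfe is the correction step: instead of a line search, the objective is re-optimized over the convex hull of all atoms selected so far.

```python
import numpy as np
from scipy.optimize import minimize

def lmo_l1(grad, tau):
    """Linear minimization oracle over the l1 ball of radius tau:
    returns the signed, scaled coordinate vector minimizing <grad, a>."""
    i = np.argmax(np.abs(grad))
    a = np.zeros_like(grad)
    a[i] = -tau * np.sign(grad[i])
    return a

def fcfw(A, b, tau, iters=20):
    """Fully corrective Frank-Wolfe sketch for
    min 0.5*||Ax - b||^2  s.t.  ||x||_1 <= tau."""
    n = A.shape[1]
    atoms = [np.zeros(n)]           # include the origin so the hull contains the start
    x = np.zeros(n)
    for _ in range(iters):
        grad = A.T @ (A @ x - b)
        atoms.append(lmo_l1(grad, tau))
        S = np.column_stack(atoms)  # n x |atoms| matrix of active atoms
        m = S.shape[1]
        # Fully corrective step: re-optimize over the convex hull of the
        # active atoms (a small simplex-constrained QP, solved here by SLSQP).
        obj = lambda w: 0.5 * np.sum((A @ (S @ w) - b) ** 2)
        res = minimize(obj, np.full(m, 1.0 / m),
                       bounds=[(0, None)] * m,
                       constraints={"type": "eq", "fun": lambda w: w.sum() - 1.0})
        x = S @ res.x
    return x
```

Because each iterate is a convex combination of atoms that all lie in the ℓ1 ball, feasibility is maintained by construction; the per-iteration QP is small (one variable per active atom).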
Similar Resources
Fast column generation for atomic norm regularization
We consider optimization problems that consist in minimizing a quadratic function under an atomic norm regularization or constraint. In the line of work on conditional gradient algorithms, we show that the fully corrective Frank-Wolfe (FCFW) algorithm — which is most naturally reformulated as a column generation algorithm in the regularized case — can be made particularly efficient for difficu...
A Deterministic Nonsmooth Frank Wolfe Algorithm with Coreset Guarantees
We present a new Frank-Wolfe (FW) type algorithm that is applicable to minimization problems with a nonsmooth convex objective. We provide convergence bounds and show that the scheme yields so-called coreset results for various Machine Learning problems including 1-median, Balanced Development, Sparse PCA, Graph Cuts, and the ℓ1-norm-regularized Support Vector Machine (SVM) among others. This m...
Efficient ℓq Minimization Algorithms for Compressive Sensing Based on Proximity Operator
This paper considers solving the unconstrained ℓq-norm (0 ≤ q < 1) regularized least squares (ℓq-LS) problem for recovering sparse signals in compressive sensing. We propose two highly efficient first-order algorithms via incorporating the proximity operator for nonconvex ℓq-norm functions into the fast iterative shrinkage/thresholding algorithm (FISTA) and the alternating direction method of multipliers...
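As an illustration of plugging a proximity operator into FISTA, here is the familiar q = 1 special case, whose proximity operator is soft-thresholding; this is a sketch, not the paper's algorithm, and the nonconvex ℓq operators it studies would take the place of `soft_threshold` below.

```python
import numpy as np

def soft_threshold(z, t):
    """Proximity operator of t*||.||_1 (the q = 1 special case; certain
    nonconvex l_q norms, e.g. q = 1/2, also admit closed-form operators)."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def fista(A, b, lam, iters=200):
    """FISTA sketch for  min 0.5*||Ax - b||^2 + lam*||x||_1."""
    L = np.linalg.norm(A, 2) ** 2       # Lipschitz constant of the gradient
    x = y = np.zeros(A.shape[1])
    t = 1.0
    for _ in range(iters):
        # Proximal gradient step at the extrapolated point y.
        x_new = soft_threshold(y - A.T @ (A @ y - b) / L, lam / L)
        t_new = (1 + np.sqrt(1 + 4 * t * t)) / 2
        y = x_new + (t - 1) / t_new * (x_new - x)   # momentum step
        x, t = x_new, t_new
    return x
```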
Linear Convergence of a Frank-Wolfe Type Algorithm over Trace-Norm Balls
We propose a rank-k variant of the classical Frank-Wolfe algorithm to solve convex optimization over a trace-norm ball. Our algorithm replaces the top singular-vector computation (1-SVD) in Frank-Wolfe with a top-k singular-vector computation (k-SVD), which can be done by repeatedly applying 1-SVD k times. Our algorithm has a linear convergence rate when the objective function is smooth and str...
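The repeated-1-SVD construction of the k-SVD described above can be sketched as follows; this is a minimal power-iteration sketch assuming a spectral gap, not the authors' implementation.

```python
import numpy as np

def top_singular_pair(G, iters=500):
    """1-SVD via power iteration on G^T G (assumes a spectral gap)."""
    rng = np.random.default_rng(0)
    v = rng.normal(size=G.shape[1])
    v /= np.linalg.norm(v)
    for _ in range(iters):
        v = G.T @ (G @ v)
        v /= np.linalg.norm(v)
    s = np.linalg.norm(G @ v)
    u = G @ v / s
    return u, s, v

def top_k_svd(G, k):
    """k-SVD by applying 1-SVD k times, deflating the matrix after
    each extracted component."""
    G = G.copy()
    U, S, V = [], [], []
    for _ in range(k):
        u, s, v = top_singular_pair(G)
        U.append(u); S.append(s); V.append(v)
        G -= s * np.outer(u, v)     # remove the found rank-1 component
    return np.column_stack(U), np.array(S), np.column_stack(V)
```

Deflation accumulates error when singular values are close, which is one reason a production implementation would use a Lanczos-type routine instead of plain power iteration.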
Fitting Spectral Decay with the k-Support Norm
The spectral k-support norm enjoys good estimation properties in low rank matrix learning problems, empirically outperforming the trace norm. Its unit ball is the convex hull of rank k matrices with unit Frobenius norm. In this paper we generalize the norm to the spectral (k, p)-support norm, whose additional parameter p can be used to tailor the norm to the decay of the spectrum of the underlyi...
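The unit-ball description in that abstract can be written compactly; the (k, p) case below is an assumption extrapolated from the abstract's wording (replacing the Frobenius norm with the Schatten p-norm of the singular values), not a statement from the paper itself.

```latex
% Unit ball of the spectral k-support norm, as stated in the abstract:
C_k = \operatorname{conv}\bigl\{ W : \operatorname{rank}(W) \le k,\ \|W\|_F \le 1 \bigr\}
% Presumed unit ball of the spectral (k, p)-support norm (assumption:
% the Frobenius norm is replaced by the Schatten p-norm):
C_{k,p} = \operatorname{conv}\bigl\{ W : \operatorname{rank}(W) \le k,\ \|W\|_{S_p} \le 1 \bigr\}
```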
Journal title:
Volume / Issue:
Pages: -
Publication year: 2016